Square Penalty Support Vector Regression

Authors

  • Álvaro Barbero Jiménez
  • Jorge López Lázaro
  • José R. Dorronsoro
Abstract

Support Vector Regression (SVR) is usually pursued using the ε–insensitive loss function while, alternatively, the initial regression problem can be reduced to a properly defined classification one. In either case, slack variables have to be introduced in practically interesting problems, the usual choice being the consideration of linear penalties for them. In this work we shall discuss the solution of an SVR problem recasting it first as a classification problem and working with square penalties. Besides a general theoretical discussion, we shall also derive some consequences for regression problems of the coefficient structure of the resulting SVMs and illustrate the procedure on some standard problems widely used as benchmarks and also over a wind energy forecasting problem.
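The two loss functions contrasted in the abstract can be sketched numerically. The snippet below is a minimal illustration (not the paper's implementation): the standard ε-insensitive loss charges slack linearly outside the ±ε tube, while the square-penalty variant charges the same slack quadratically.

```python
import numpy as np

def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    """Standard SVR loss: residuals inside the +/-eps tube cost nothing,
    and slack outside the tube is penalized linearly."""
    return np.maximum(np.abs(y_true - y_pred) - eps, 0.0)

def square_penalty_loss(y_true, y_pred, eps=0.1):
    """Square-penalty variant: the same slack, penalized quadratically.
    Large violations are punished more heavily, small ones less."""
    return eps_insensitive_loss(y_true, y_pred, eps) ** 2

# A residual of 0.3 with eps = 0.1 leaves slack 0.2:
y_true = np.array([1.0])
y_pred = np.array([1.3])
print(eps_insensitive_loss(y_true, y_pred))  # linear penalty: 0.2
print(square_penalty_loss(y_true, y_pred))   # squared penalty: 0.04
```

Note how the squared penalty also makes the loss differentiable at the tube boundary, which is one practical motivation for studying it in place of the linear penalty.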


Related articles

The Application of Least Square Support Vector Machine as a Mathematical Algorithm for Diagnosing Drilling Effectivity in Shaly Formations

The problem of slow drilling in deep shale formations occurs worldwide, causing significant expenses to the oil industry. Bit balling, which is widely considered the main cause of poor bit performance in shales, arises especially when deep shales are drilled with water-based mud. Therefore, efforts have been made to develop a model to diagnose drilling effectivity. Hence, we arrived at graphical cor...


Support vector regression for prediction of gas reservoirs permeability

Reservoir permeability is a critical parameter for characterization of hydrocarbon reservoirs. In fact, determination of permeability is a crucial task in reserve estimation, production and development. Traditional methods for permeability prediction are well log and core data analysis, which are very expensive and time-consuming. Well log data is an alternative approach for prediction of pe...


TROP-ELM: A double-regularized ELM using LARS and Tikhonov regularization

In this paper an improvement of the optimally pruned extreme learning machine (OP-ELM) in the form of a L2 regularization penalty applied within the OP-ELM is proposed. The OP-ELM originally proposes a wrapper methodology around the extreme learning machine (ELM) meant to reduce the sensitivity of the ELM to irrelevant variables and obtain more parsimonious models thanks to neuron pruning. The ...


Hardness on Numerical Realization of Some Penalized Likelihood Estimators

Abstract: We show that with a class of penalty functions, numerical problems associated with the implementation of the penalized least square estimators are equivalent to the exact cover by 3-sets problem, which belongs to a class of NP-hard problems. We then extend this NP-hardness result to the cases of penalized least absolute deviation regression and penalized support vector machines. We di...


Robust Support Vector Machine Using Least Median Loss Penalty

It is found that data points used for training may contain outliers that can generate unpredictable disturbance for some Support Vector Machine (SVM) classification problems. Traditional convex SVM methods place no theoretical bound on such adverse influence. We present a novel robust misclassification penalty function for SVM which is inspired by the concept of "Least Median Regression". I...




Publication date: 2007